# High parameter density

## Esotericknowledge 24B

A 24B-parameter merged language model that uses the TIES method to fuse several 24B-scale pre-trained models, with a focus on high-quality text generation and comprehension.

Tags: Large Language Model, Transformers · Author: yamatazen

## Dazzling Star Aurora 32b V0.0 Experimental 1130

A merged model based on Qwen2.5-32B that uses the TIES technique to combine several 32B-parameter models, excelling at text generation and comprehension.

Tags: Large Language Model, Transformers · Author: LyraNovaHeart

## L3 SthenoMaidBlackroot 8B V1

An 8B-parameter language model merged with the mergekit tool, built on Sao10K/L3-8B-Stheno-v3.2 as the base model and incorporating features from Jamet-8B-L3 and Llama-3-Lumimaid-8B.

Tags: Large Language Model, Transformers · Author: bluuwhale
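
To make the TIES references above concrete, here is a minimal sketch of what a TIES merge computes on PyTorch state dicts: trim each fine-tune's delta from the base to its largest-magnitude entries, elect a majority sign per parameter, then average only the deltas that agree with that sign. This illustrates the general technique, not the exact recipe behind any of the models listed here; the `ties_merge` function, its `density` parameter, and the plain state-dict inputs are assumptions made for the example.

```python
import torch

def ties_merge(base_sd, finetuned_sds, density=0.5):
    """Sketch of TIES merging: trim, elect sign, merge agreeing deltas."""
    merged = {}
    for name, base_w in base_sd.items():
        # Task vectors: how each fine-tune differs from the base weights.
        deltas = [sd[name] - base_w for sd in finetuned_sds]
        trimmed = []
        for d in deltas:
            # Trim: keep only the top `density` fraction of entries by magnitude.
            k = max(1, int(density * d.numel()))
            thresh = d.abs().flatten().kthvalue(d.numel() - k + 1).values
            trimmed.append(torch.where(d.abs() >= thresh, d, torch.zeros_like(d)))
        stacked = torch.stack(trimmed)
        # Elect: the majority sign is the sign of the summed trimmed deltas.
        sign = stacked.sum(dim=0).sign()
        # Merge: average only the nonzero deltas that agree with the elected sign.
        agree = (stacked.sign() == sign) & (stacked != 0)
        total = (stacked * agree).sum(dim=0)
        count = agree.sum(dim=0).clamp(min=1)
        merged[name] = base_w + total / count
    return merged

# Hypothetical usage: merge two fine-tunes back onto their shared base model.
# merged_sd = ties_merge(base.state_dict(), [ft1.state_dict(), ft2.state_dict()])
```

In practice, merges like these are typically driven by a mergekit configuration file rather than hand-written code; the sketch only shows the arithmetic the TIES step performs.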